# Multi-task pretraining
## TiBERT Base
Maintainer: fgaim · Type: large language model (other)

A BERT base model pretrained specifically for Tigrinya, trained for 40 epochs on a dataset of 40 million tokens.
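Since TiBERT is a standard BERT-style checkpoint, it can be exercised through the usual masked-language-modeling interface. A minimal sketch, assuming the checkpoint is hosted on the Hugging Face Hub as `fgaim/tibert-base` and loads with the stock BERT classes:

```python
from transformers import pipeline

# Assumed Hub ID; a BERT-style checkpoint exposes the fill-mask task.
fill_mask = pipeline("fill-mask", model="fgaim/tibert-base")

# Placeholder Tigrinya sentence ("I am [MASK]."); the model returns
# candidate tokens for the masked position with their scores.
for pred in fill_mask("ኣነ [MASK] እየ።"):
    print(pred["token_str"], round(pred["score"], 3))
```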
## CodeTrans T5 Large: Source Code Summarization (Python, Multitask Fine-tune)
Maintainer: SEBIS · Task: text generation

A pretrained model based on the T5-large architecture, designed for Python source code summarization with multi-task learning support.
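A minimal usage sketch, assuming the checkpoint is hosted on the Hugging Face Hub under `SEBIS/code_trans_t5_large_source_code_summarization_python_multitask_finetune` and loads with the standard T5 seq2seq classes; the CodeTrans family was trained on whitespace-tokenized source code, so the input is pre-tokenized here:

```python
from transformers import pipeline

# Assumed Hub ID for the multitask fine-tuned Python summarization checkpoint.
model_id = "SEBIS/code_trans_t5_large_source_code_summarization_python_multitask_finetune"
summarizer = pipeline("summarization", model=model_id)

# CodeTrans expects pre-tokenized (space-separated) Python code as input.
code = "def add ( a , b ) : return a + b"
print(summarizer(code)[0]["summary_text"])
```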